Numerical CP decomposition of some difficult tensors
Abstract
In this paper, a numerical method is proposed for canonical polyadic (CP) decomposition of small-size tensors. The focus is primarily on the decomposition of tensors that correspond to small matrix multiplications. Here, the rank of the tensor equals the smallest number of scalar multiplications needed to carry out the matrix multiplication. The proposed method is based on a constrained Levenberg-Marquardt optimization. Numerical results indicate the ranks and border ranks of the tensors that correspond to multiplication of matrices of sizes 2×3 and 3×2, 3×3 and 3×2, 3×3 and 3×3, and 3×4 and 4×3. The ranks are 11, 15, 23 and 29, respectively. In particular, a novel algorithm for multiplying matrices of sizes 3×3 and 3×2 with 15 scalar multiplications is presented.
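The abstract ties the CP rank of a matrix-multiplication tensor to the minimal number of scalar multiplications. The NumPy sketch below builds that tensor for a general m×k by k×n product and fits a rank-R CP model with plain alternating least squares, as an illustration of the setup; the function names (`matmul_tensor`, `khatri_rao`, `cp_als`), the indexing convention, and the ALS fitting scheme are assumptions for this sketch and are not the paper's constrained Levenberg-Marquardt method.

```python
import numpy as np


def matmul_tensor(m, k, n):
    """Tensor T of shape (m*k, k*n, n*m) encoding the product of an m-by-k
    and a k-by-n matrix; one common indexing convention (an assumption here)
    sets T[(i,j), (j,l), (l,i)] = 1."""
    T = np.zeros((m * k, k * n, n * m))
    for i in range(m):
        for j in range(k):
            for l in range(n):
                T[i * k + j, j * n + l, l * m + i] = 1.0
    return T


def khatri_rao(X, Y):
    """Column-wise Khatri-Rao product; rows are indexed by pairs (row of X, row of Y)."""
    R = X.shape[1]
    return (X[:, None, :] * Y[None, :, :]).reshape(-1, R)


def cp_als(T, R, iters=2000, seed=0):
    """Fit a rank-R CP model to T with plain alternating least squares.
    Illustration only: the paper uses a constrained Levenberg-Marquardt scheme."""
    rng = np.random.default_rng(seed)
    I, J, K = T.shape
    A = rng.standard_normal((I, R))
    B = rng.standard_normal((J, R))
    C = rng.standard_normal((K, R))
    T1 = T.reshape(I, J * K)                           # mode-1 unfolding
    T2 = np.transpose(T, (1, 0, 2)).reshape(J, I * K)  # mode-2 unfolding
    T3 = np.transpose(T, (2, 0, 1)).reshape(K, I * J)  # mode-3 unfolding
    for _ in range(iters):
        A = T1 @ khatri_rao(B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = T2 @ khatri_rao(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = T3 @ khatri_rao(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    residual = np.linalg.norm(T - np.einsum('ir,jr,kr->ijk', A, B, C))
    return A, B, C, residual


if __name__ == "__main__":
    # Sanity check on the classical 2x2 case: a rank-7 fit (Strassen) can reach
    # an essentially exact decomposition, while rank 6 cannot; unconstrained ALS
    # may need several random restarts and many iterations to get there.
    T = matmul_tensor(2, 2, 2)
    for R in (6, 7):
        *_, res = cp_als(T, R, seed=3)
        print(f"rank {R}: residual {res:.2e}")
```

For the tensors studied in the paper one would call, e.g., `matmul_tensor(3, 3, 2)` and search for a rank-15 fit; reaching the reported ranks reliably is exactly what motivates the constrained Levenberg-Marquardt method proposed by the authors.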
Similar references
Completely Positive Tensors: Properties, Easily Checkable Subclasses, and Tractable Relaxations
The completely positive (CP) tensor verification and decomposition are essential in tensor analysis and computation due to the wide applications in statistics, computer vision, exploratory multiway data analysis, blind source separation, and polynomial optimization. However, it is generally NP-hard as we know from its matrix case. To facilitate the CP tensor verification and decomposition, more...
Probabilistic inference with noisy-threshold models based on a CP tensor decomposition
The specification of conditional probability tables (CPTs) is a difficult task in the construction of probabilistic graphical models. Several types of canonical models have been proposed to ease that difficulty. Noisy-threshold models generalize the two most popular canonical models: the noisy-or and the noisy-and. When using the standard inference techniques the inference complexity is exponen...
Tensor Decompositions and Applications
This survey provides an overview of higher-order tensor decompositions, their applications, and available software. A tensor is a multidimensional or N-way array. Decompositions of higher-order tensors (i.e., N-way arrays with N ≥ 3) have applications in psychometrics, chemometrics, signal processing, numerical linear algebra, computer vision, numerical analysis, data mining, neuroscience, grap...
New Ranks for Even-Order Tensors and Their Applications in Low-Rank Tensor Optimization
In this paper, we propose three new tensor decompositions for even-order tensors corresponding respectively to the rank-one decompositions of some unfolded matrices. Consequently such new decompositions lead to three new notions of (even-order) tensor ranks, to be called the M-rank, the symmetric M-rank, and the strongly symmetric M-rank in this paper. We discuss the bounds between these new te...
Vectorial Dimension Reduction for Tensors Based on Bayesian Inference
Dimensionality reduction for high-order tensors is a challenging problem. In conventional approaches, higher order tensors are “vectorized” via Tucker decomposition to obtain lower order tensors. This will destroy the inherent high-order structures or result in undesired tensors. This paper introduces a probabilistic vectorial dimensionality reduction model for tensorial data. ...
Journal: J. Computational and Applied Mathematics
Volume: 317, Issue: -
Pages: -
Publication date: 2017